On the Rényi Divergence, Joint Range of Relative Entropies, and a Channel Coding Theorem

Author

  • Igal Sason
Abstract

This paper starts by considering the minimization of the Rényi divergence subject to a constraint on the total variation distance. Based on the solution of this optimization problem, the exact locus of the points ( D(Q∥P1), D(Q∥P2) ) is determined when P1, P2, Q are arbitrary probability measures which are mutually absolutely continuous, and the total variation distance between P1 and P2 is not below a given value. It is further shown that all the points of this convex region are attained by probability measures which are defined on a binary alphabet. This characterization yields a geometric interpretation of the minimal Chernoff information subject to a constraint on the variational distance. This paper also derives an exponential upper bound on the performance of binary linear block codes (or code ensembles) under maximum-likelihood decoding. Its derivation relies on the Gallager bounding technique, and it reproduces the Shulman-Feder bound as a special case. The bound is expressed in terms of the Rényi divergence from the normalized distance spectrum of the code (or the average distance spectrum of the ensemble) to the binomially distributed distance spectrum of the capacity-achieving ensemble of random block codes. This exponential bound provides a quantitative measure of the degradation in performance of binary linear block codes (or code ensembles) as a function of the deviation of their distance spectra from the binomial distribution. An efficient use of this bound is considered.
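For readers who want to experiment with the quantities named in the abstract, the following is a minimal numerical sketch (not taken from the paper; the function names and the binary example are illustrative assumptions). It evaluates the Rényi divergence of order α, the relative entropy, and the total variation distance on a two-letter alphabet, the setting in which, per the abstract, every point of the joint range (D(Q∥P1), D(Q∥P2)) is already attained. Note that some authors define the total variation distance without the factor 1/2.

    import numpy as np

    def renyi_divergence(p, q, alpha):
        # Renyi divergence D_alpha(P||Q) = (1/(alpha-1)) * log sum_x p(x)^alpha * q(x)^(1-alpha),
        # here for alpha > 0, alpha != 1, and fully supported distributions (result in nats).
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

    def relative_entropy(p, q):
        # Relative entropy D(P||Q), the alpha -> 1 limit of the Renyi divergence.
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        support = p > 0
        return np.sum(p[support] * np.log(p[support] / q[support]))

    def total_variation(p, q):
        # Total variation distance (1/2) * sum_x |p(x) - q(x)|;
        # some papers in this literature omit the factor 1/2.
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return 0.5 * np.sum(np.abs(p - q))

    # Illustrative binary-alphabet example (values chosen arbitrarily):
    P1, P2, Q = [0.2, 0.8], [0.6, 0.4], [0.4, 0.6]
    print(total_variation(P1, P2))                           # the constrained quantity
    print(relative_entropy(Q, P1), relative_entropy(Q, P2))  # the pair (D(Q||P1), D(Q||P2))
    print(renyi_divergence(Q, P1, alpha=0.5))                # a Renyi divergence of order 1/2

Sweeping Q over all distributions on the two-letter alphabet, while P1 and P2 keep a prescribed total variation distance, traces out points of the joint range studied in the paper.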


Similar Articles

A Preferred Definition of Conditional Rényi Entropy

The Rényi entropy generalizes the Shannon entropy to a one-parameter family of entropies. The Tsallis entropy is another generalization of the Shannon entropy, based on a non-logarithmic measure. After the introduction of the Shannon entropy, the conditional Shannon entropy was derived and its properties became known. Also, for the Tsallis entropy, the conditional entropy was introduced a...
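For context (standard definitions, stated here because the snippet above is truncated), the Rényi entropy of order α and the Tsallis entropy of order q of a discrete random variable X with distribution p are

    H_\alpha(X) = \frac{1}{1-\alpha}\,\log \sum_x p(x)^{\alpha},
    \qquad
    S_q(X) = \frac{1}{q-1}\Bigl(1 - \sum_x p(x)^{q}\Bigr),

and both reduce to the Shannon entropy H(X) = -\sum_x p(x)\log p(x) in the limit α → 1 (respectively q → 1); the Tsallis form is the non-logarithmic measure referred to above.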


α-z-Rényi relative entropies

We consider a two-parameter family of Rényi relative entropies Dα,z(ρ||σ) that are quantum generalisations of the classical Rényi divergence Dα(p||q). This family includes many known relative entropies (or divergences) such as the quantum relative entropy, the recently defined quantum Rényi divergences, as well as the quantum Rényi relative entropies. All its members satisfy the quantum general...
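For orientation (stated as background from the standard literature, since the snippet is truncated), the classical Rényi divergence referred to above and the two-parameter quantum family of Audenaert and Datta are commonly written as

    D_\alpha(p\|q) = \frac{1}{\alpha-1}\,\log \sum_x p(x)^{\alpha} q(x)^{1-\alpha},
    \qquad
    D_{\alpha,z}(\rho\|\sigma) = \frac{1}{\alpha-1}\,\log \mathrm{Tr}\Bigl[\Bigl(\sigma^{\frac{1-\alpha}{2z}}\,\rho^{\frac{\alpha}{z}}\,\sigma^{\frac{1-\alpha}{2z}}\Bigr)^{z}\Bigr],

with z = 1 recovering the (Petz) quantum Rényi relative entropy and z = α the sandwiched quantum Rényi divergence.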


On the Rényi divergence and the joint range of relative entropies

This paper starts with a study of the minimum of the Rényi divergence subject to a fixed (or minimal) value of the total variation distance. Relying on the solution of this minimization problem, we determine the exact region of the points ( D(Q||P1), D(Q||P2) ) where P1 and P2 are any probability distributions whose total variation distance is not below a fixed value, and the probability distri...


Minimization Problems Based on a Parametric Family of Relative Entropies I: Forward Projection

Minimization problems with respect to a one-parameter family of generalized relative entropies are studied. These relative entropies, which we term relative α-entropies (denoted Iα), arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual r...


Optimized quantum f-divergences and data processing

The quantum relative entropy is a measure of the distinguishability of two quantum states, and it is a unifying concept in quantum information theory: many information measures such as entropy, conditional entropy, mutual information, and entanglement measures can be realized from it. As such, there has been broad interest in generalizing the notion to further understand its most basic properti...
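For reference (a standard definition, not part of the truncated snippet), the quantum relative entropy of a state ρ with respect to a positive semi-definite operator σ is

    D(\rho\|\sigma) = \mathrm{Tr}\bigl[\rho\,(\log\rho - \log\sigma)\bigr]

whenever the support of ρ is contained in that of σ (and +∞ otherwise); when ρ and σ commute it reduces to the classical relative entropy D(p‖q).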



Journal:
  • IEEE Trans. Information Theory

Volume: 62  Issue: -

Pages: -

Published: 2016